59 research outputs found

    How to Optimize Online Mixed-Device Surveys: The Effects of a Messenger Survey, Answer Scales, Devices and Personal Characteristics

    The goal of this research was to determine the best way to present mixed-device surveys. We investigate the effect of survey method (messenger versus regular survey), answer scale, device used, and personal characteristics such as gender, age, and education on break-off rate, substantive answers, completion time, and respondents' evaluation of the survey. Our research does not suggest that a messenger survey affects mixed-device surveys positively. Further research is necessary to investigate how to optimally present mixed-device surveys in order to increase participation and data quality.

    Can we predict device use? An investigation into mobile device use in surveys

    In this study, we investigate whether mobile device use in surveys can be predicted. We aim to identify possible motives for device use and build a model by drawing on theory from technology acceptance research and survey research. We then test this model with a Structural Equation Modeling approach using data from seven waves of the GESIS panel. We test whether our theoretical model fits the data by focusing on measures of fit and by studying the standardized effects of the model. Results reveal that intention to use a particular device predicts actual use quite well. Ease of smartphone use is the most meaningful variable: people who already use a smartphone for specific tasks are more likely to intend to use one for survey completion. In conclusion, investing in ease of use of mobile survey completion could encourage respondents to use mobile devices. This can foremost be established by building well-designed surveys for mobile devices.

    Adapting Surveys to the Modern World: Comparing a Research Messenger Design to a Regular Responsive Design for Online Surveys

    Online surveys are increasingly completed on smartphones. There are several ways to structure online surveys so as to create an optimal experience for any screen size. For example, communicating through applications (apps) such as WhatsApp and Snapchat closely resembles natural turn-by-turn conversations between individuals. Web surveys currently mostly mimic the design of paper questionnaires, leading to a survey experience that may not be optimal when completed on smartphones. In this paper, we compare a research messenger design, which mimics a messenger-app type of communication, to a responsive survey design. We investigate whether response quality is similar between the two designs and whether respondents' satisfaction with the survey is higher for either version. Our results show no differences for primacy effects, number of nonsubstantive answers, and dropout rate. The length of open-ended answers was shorter for the research messenger survey compared to the responsive design, and the overall completion time was longer in the research messenger survey. The evaluation at the end of the survey showed no clear indication that respondents liked the research messenger survey more than the responsive design. Future research should focus on how to optimally design online mixed-device surveys in order to increase respondent satisfaction and data quality.

    Understanding Willingness to Share Smartphone-Sensor Data

    The growing smartphone penetration and the integration of smartphones into people’s everyday practices offer researchers opportunities to augment survey measurement with smartphone-sensor measurement or to replace self-reports. Potential benefits include lower measurement error, a widening of research questions, collection of in situ data, and a lowered respondent burden. However, privacy considerations and other concerns may lead to nonparticipation. To date, little is known about the mechanisms of willingness to share sensor data in the general population, and no evidence is available concerning the stability of willingness. The present study focuses on survey respondents’ willingness to share data collected using smartphone sensors (GPS, camera, and wearables) in a probability-based online panel of the general population of the Netherlands. A randomized experiment varied study sponsor, framing of the request, the emphasis on control over the data collection process, and assurance of privacy and confidentiality. Respondents were asked repeatedly about their willingness to share the data collected using smartphone sensors, with varying periods before the second request. Willingness to participate in sensor-based data collection varied by the type of sensor, study sponsor, order of the request, respondent’s familiarity with the device, previous experience with participating in research involving smartphone sensors, and privacy concerns. Willingness increased when respondents were asked repeatedly and varied by sensor and task. The timing of the repeated request, one month or six months after the initial request, did not have a significant effect on willingness.

    Recruiting Young and Urban Groups into a Probability-Based Online Panel by Promoting Smartphone Use

    A sizable minority of all web surveys are nowadays completed on smartphones. People who choose a smartphone for Internet-related tasks are different from people who mainly use a PC or tablet. Smartphone use is particularly high among the young and urban. We have to make web surveys attractive for smartphone completion in order not to lose these groups of smartphone users. In this paper we study how to encourage people to complete surveys on smartphones in order to attract hard-to-reach subgroups of the population. We experimentally test new features of a survey-friendly design: two versions of an invitation letter to a survey, a new questionnaire layout, and autoforwarding. The goal of the experiment is to evaluate whether the new survey design attracts more smartphone users, leads to a better survey experience on smartphones, and results in more respondents signing up to become a member of a probability-based online panel. Our results show that the invitation letter that emphasizes the possibility of smartphone completion does not yield a higher response rate than the control condition, nor do we find differences in the socio-demographic background of respondents. We do find that slightly more respondents choose a smartphone for survey completion. The changes in the layout of the questionnaire do lead to a change in survey experience on the smartphone. Smartphone respondents need 20% less time to complete the survey when the questionnaire includes autoforwarding. However, we do not find that respondents evaluate the survey better, nor are they more likely to become a member of the panel when asked at the end of the survey. We conclude with a discussion of autoforwarding in web surveys and methods to attract smartphone users to web surveys.

    Testing the Effects of Automated Navigation in a General Population Web Survey

    This study investigates how an auto-forward design, where respondents navigate through a web survey automatically, affects response times and navigation behavior in a long mixed-device web survey. We embedded an experiment in a health survey administered to the general population in the Netherlands to test the auto-forward design against a manual-forward design. Analyses are based on detailed paradata that keep track of the respondents’ behavior in navigating the survey. We find that an auto-forward design decreases completion times and that questions on pages with automated navigation are answered significantly faster compared to questions on pages with manual navigation. However, we also find that respondents use the navigation buttons more in the auto-forward condition compared to the manual-forward condition, largely canceling out the reduction in survey duration. Furthermore, we find that the answer options 'I don't know' and 'I'd rather not say' are used just as often in the auto-forward condition as in the manual-forward condition, indicating no differences in satisficing behavior. We conclude that auto-forwarding can be used to reduce completion times, but we also advise careful consideration before mixing manual and auto-forwarding within a survey.

    LISS panel > Religion and Ethnicity > Wave 1

    This questionnaire is about religion and ethnicity and is part of the first wave of the LISS Core Study.

    LISS panel > Politics and Values > Wave 1

    The survey focuses on politics and values.